robot exclusion standard - meaning and definition. What is the robot exclusion standard?

What is the robot exclusion standard - definition

Standard used to advise web crawlers and scrapers not to index a web page or site.
Also known as: robots exclusion standard; robots exclusion protocol; robots.txt protocol; standard for robot exclusion; robots.txt.

Robots exclusion standard
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots.
robot exclusion standard
standard for robot exclusion
<World-Wide Web> A proposal to prevent the havoc wreaked by many of the early World-Wide Web robots, which retrieved documents too rapidly or retrieved documents that had side effects (such as voting). The proposed standard for robot exclusion offers a solution to these problems in the form of a file called "robots.txt" placed in the {document root} of the web site. {W3C standard (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1)}. (2006-10-17)
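As a sketch of the file format described above, a minimal robots.txt served from the document root (the site name and paths here are hypothetical) might look like:

```
# Hypothetical file at https://example.com/robots.txt
User-agent: *          # rules below apply to all crawlers
Disallow: /private/    # ask crawlers not to fetch this subtree
Disallow: /cgi-bin/    # avoid URLs with side effects
```

Each record names a crawler (or `*` for all) followed by `Disallow` rules giving URL path prefixes the crawler is asked to skip.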

Wikipedia

Robots.txt

robots.txt is a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.

This relies on voluntary compliance. Not all robots comply with the standard; email harvesters, spambots, malware, and robots that scan for security vulnerabilities may even start with the portions of the website from which they have been told to stay away.
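A well-behaved crawler checks these rules before fetching a URL. As an illustration, Python's standard-library `urllib.robotparser` can evaluate a robots.txt policy (the rules and URLs below are made up for the example):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical policy: disallow /private/ for every crawler.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())  # parse the policy without a network fetch

# A compliant crawler asks before fetching each URL.
print(rp.can_fetch("ExampleBot", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("ExampleBot", "https://example.com/index.html"))         # True
```

In practice a crawler would call `rp.set_url(".../robots.txt")` and `rp.read()` to load the live file, then consult `can_fetch()` for every URL it plans to retrieve.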

The "robots.txt" file can be used in conjunction with sitemaps, another robot inclusion standard for websites.